Accelerating Evolutionary Neural Architecture Search via Multifidelity Evaluation
Abstract
Evolutionary neural architecture search (ENAS) has recently received increasing attention for effectively finding high-quality neural architectures, but it consumes a high computational cost by training the architecture encoded by each individual for complete epochs during evaluation. Numerous ENAS approaches have been developed to reduce the evaluation cost, yet it is often difficult for most of them to achieve high evaluation accuracy. To address this issue, in this article we propose an accelerated ENAS via multifidelity evaluation, termed MFENAS, where the evaluation cost is significantly reduced by training each individual for only a small number of epochs. The balance between evaluation cost and evaluation accuracy is well maintained by the suggested multifidelity evaluation, which identifies potentially good individuals that cannot survive from previous generations by integrating multiple evaluations under different numbers of training epochs. Besides, a population initialization strategy is devised to produce diverse architectures, varying from ResNet-like to Inception-like ones. As shown in the experiments, the proposed MFENAS takes only 0.6 GPU days to find its best architecture, which holds a 2.39% test error rate, superior to state-of-the-art approaches. The architectures transferred to CIFAR-100 and ImageNet also exhibit competitive performance.
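The rank-combining idea behind the multifidelity evaluation can be sketched in a few lines. This is a minimal illustration under assumptions, not the paper's implementation: `multifidelity_select` and `evaluate` are hypothetical names, individuals are assumed hashable, and a candidate survives on its best rank at any fidelity (epoch budget).

```python
def multifidelity_select(population, evaluate, fidelities, keep):
    """Combine cheap, noisy evaluations taken at several fidelities
    (e.g., different epoch budgets) into one survival decision.

    A candidate's score is its best rank across all fidelities, so an
    individual that looks weak after 1 epoch but strong after 4 epochs
    can still survive -- the rescue effect the abstract describes.
    """
    best_rank = {ind: len(population) for ind in population}
    for f in fidelities:
        # Rank descending by the (cheap) fitness estimate at this fidelity.
        ordered = sorted(population, key=lambda ind: evaluate(ind, f), reverse=True)
        for rank, ind in enumerate(ordered):
            best_rank[ind] = min(best_rank[ind], rank)
    # Keep the individuals with the best rank achieved at any fidelity.
    return sorted(population, key=lambda ind: best_rank[ind])[:keep]
```

Taking the minimum rank (rather than, say, the mean) is what lets a slow-starting architecture that only shines at a higher epoch budget displace one that peaked early.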
Similar references
Accelerating Neural Architecture Search using Performance Prediction
Methods for neural network hyperparameter optimization and meta-modeling are computationally expensive due to the need to train a large number of model configurations. In this paper, we show that standard frequentist regression models can predict the final performance of partially trained model configurations using features based on network architectures, hyperparameters, and time-series valida...
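The regression idea in this blurb can be sketched with a single-feature least-squares fit from an early validation accuracy to the final accuracy. The single-feature choice and function name are simplifying assumptions; the paper's models use richer architecture, hyperparameter, and time-series features.

```python
def fit_final_acc_predictor(early_accs, final_accs):
    """Least-squares line from one early-training feature (e.g., the
    validation accuracy after a few epochs) to the final accuracy --
    a minimal 'standard frequentist regression model' in the spirit
    of the blurb above.
    """
    n = len(early_accs)
    mean_x = sum(early_accs) / n
    mean_y = sum(final_accs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(early_accs, final_accs))
    var = sum((x - mean_x) ** 2 for x in early_accs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return lambda x: slope * x + intercept
```

A predictor like this lets a search discard configurations whose projected final accuracy is poor after only a few epochs of actual training.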
Efficient Neural Architecture Search via Parameter Sharing
We propose Efficient Neural Architecture Search (ENAS), a fast and inexpensive approach for automatic model design. In ENAS, a controller discovers neural network architectures by searching for an optimal subgraph within a large computational graph. The controller is trained with policy gradient to select a subgraph that maximizes the expected reward on a validation set. Meanwhile the model cor...
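A toy sketch of the weight-sharing trick: each sampled subgraph reuses entries of one shared parameter table instead of being trained from scratch. The names and the stand-in scoring function are illustrative assumptions, and the controller's policy-gradient update is omitted.

```python
import random

def sample_subgraph(shared_weights, n_nodes, rng):
    """Sample one op per node from the large shared graph; evaluating
    the subgraph reuses the shared table, so no per-child training
    from scratch is needed (the core weight-sharing idea)."""
    ops = sorted(shared_weights)
    return [rng.choice(ops) for _ in range(n_nodes)]

def score_subgraph(subgraph, shared_weights):
    # Stand-in for the validation reward: just sums the shared entries.
    return sum(shared_weights[op] for op in subgraph)
```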
CentroidBLAST: Accelerating Sequence Search via Clustering
BLAST, short for Basic Local Alignment Search Tool, searches for regions of local similarity between a query sequence and a large database of DNA or amino-acid sequences. It serves as a fundamental tool to many discovery processes in bioinformatics and computational biology, including inferring functional and evolutionary relationships between sequences, identifying members of gene families, an...
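The clustering speedup can be sketched as a two-stage search: compare the query against one representative per cluster, then search exhaustively only inside the closest clusters. The cluster layout is an assumption, and Hamming distance is only a stand-in here; BLAST itself scores local alignments.

```python
def centroid_search(query, clusters, distance, expand_top=1):
    """Two-stage search in the spirit of CentroidBLAST: a coarse pass
    over centroids prunes most of the database, then a fine pass runs
    only over members of the top-ranked clusters."""
    # Stage 1: coarse pass over centroids only.
    ranked = sorted(clusters, key=lambda c: distance(query, c["centroid"]))
    best = None
    # Stage 2: fine pass over members of the closest clusters.
    for cluster in ranked[:expand_top]:
        for member in cluster["members"]:
            if best is None or distance(query, member) < distance(query, best):
                best = member
    return best
```

With k clusters of roughly equal size n/k, the coarse pass costs k comparisons instead of n, at the risk of missing a hit whose cluster centroid ranks poorly.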
Progressive Neural Architecture Search
We propose a method for learning CNN structures that is more efficient than previous approaches: instead of using reinforcement learning (RL) or genetic algorithms (GA), we use a sequential model-based optimization (SMBO) strategy, in which we search for architectures in order of increasing complexity, while simultaneously learning a surrogate function to guide the search, similar to A* search....
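The progressive, surrogate-guided loop can be sketched as a beam search over increasingly complex architectures. All names here are hypothetical, and the surrogate is passed in as a plain scoring function rather than learned online as in the paper.

```python
def progressive_search(seeds, expand, surrogate, train_eval, beam=2, depth=2):
    """Sketch of sequential model-based search: grow architectures in
    order of increasing complexity, let a cheap surrogate keep only
    the `beam` most promising candidates at each step, and pay the
    full training cost only for the final survivors."""
    frontier = list(seeds)
    for _ in range(depth):
        children = [c for arch in frontier for c in expand(arch)]
        # Cheap surrogate filter before any expensive training.
        frontier = sorted(children, key=surrogate, reverse=True)[:beam]
    # Full (expensive) evaluation only on the surviving candidates.
    return max(frontier, key=train_eval)
```

The surrogate keeps the frontier from exploding combinatorially: each step trains at most `beam * branching_factor` candidates instead of every architecture of that complexity.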
Differentiable Neural Network Architecture Search
The successes of deep learning in recent years have been fueled by the development of innovative neural network architectures. However, the design of a neural network architecture remains a difficult problem, requiring significant human expertise as well as computational resources. In this paper, we propose a method for transforming a discrete neural network architecture space into a continu...
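The discrete-to-continuous transformation can be sketched as a softmax mixture over candidate operations. This is only an illustration of the relaxation, with hypothetical names; a real system would backpropagate through the mixture to update the logits.

```python
import math

def mixed_op(alphas, op_outputs):
    """Continuous relaxation of a discrete op choice: the output is a
    softmax-weighted sum of every candidate op's output, so the
    architecture choice becomes differentiable in the logits `alphas`.
    """
    m = max(alphas)  # subtract the max for numerical stability
    exps = [math.exp(a - m) for a in alphas]
    total = sum(exps)
    weights = [e / total for e in exps]
    return sum(w * y for w, y in zip(weights, op_outputs))
```

After search, the relaxation is typically discretized again by keeping the op with the largest logit at each position.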
Journal
Journal title: IEEE Transactions on Cognitive and Developmental Systems
Year: 2022
ISSN: 2379-8920, 2379-8939
DOI: https://doi.org/10.1109/tcds.2022.3179482